---
title: "Positron IDE"
author: "Tony Duan"
date: "2024-09-19"
categories: [Tool]
execute:
warning: false
error: false
format:
html:
toc: true
code-fold: show
code-tools: true
number-sections: true
code-block-bg: true
code-block-border-left: "#31BAE9"
image: "images/positron.png"
---
# Download
Install Positron from its Releases page.

Positron currently ships pre-release builds from a continuous integration (CI) system for macOS, Windows, and Linux. Each pre-release build is tagged with a version number on the GitHub repository. Select the build you want, click **Assets**, and download the `.dmg` (macOS), `.exe` (Windows), or `.deb` (Linux) file.
https://github.com/posit-dev/positron
# Using Copilot with the {chattr} R Package
https://github.com/mlverse/chattr
```{r}
#| eval: false
remotes::install_github("mlverse/chattr")
```
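With the package installed, you can select a backend and open the chat app. A minimal sketch, assuming GitHub Copilot is already authenticated in your IDE (the `"copilot"` backend name follows the chattr documentation):

```r
library(chattr)

# Select GitHub Copilot as the backend (assumes Copilot access is
# already configured in the IDE)
chattr_use("copilot")

# Open the chat interface
chattr_app()
```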
## Using a local model: LlamaGPT-Chat
Download instructions: https://mlverse.github.io/chattr/articles/backend-llamagpt.html#installation
1. Clone the LlamaGPTJ-chat repository
```{bash}
#| eval: false
git clone --recurse-submodules https://github.com/kuvaus/LlamaGPTJ-chat
cd LlamaGPTJ-chat
```
2. Download the model file Vicuna 7B v1.1: `ggml-vicuna-7b-1.1-q4_2.bin`

https://github.com/kuvaus/LlamaGPTJ-chat?tab=readme-ov-file#gpt-j-llama-and-mpt-models
3. Build the chat program
```{bash}
#| eval: false
# install cmake (macOS, via Homebrew)
brew install cmake
# configure and build in a separate build directory
mkdir build
cd build
cmake ..
cmake --build . --parallel
```
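Once the build finishes, the resulting `chat` binary can be pointed at the model file downloaded in step 2. A sketch, assuming you are still in the `build` directory and the model sits one level up (the binary name and `-m` flag follow the LlamaGPTJ-chat README; adjust the paths to match where you saved the model):

```shell
# run the local chat client against the Vicuna model file
# (model path below is an example location)
./chat -m ../ggml-vicuna-7b-1.1-q4_2.bin -t 4
```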
```{r}
#| eval: false
library(chattr)

# use the local LlamaGPT backend
chattr_use("llamagpt")
```
```{r}
#| eval: false
# show the current chat defaults (backend, model path, prompt)
chattr_defaults()
```
```{r}
#| eval: false
# confirm that chattr can reach the model
chattr_test()
```
```{r}
#| eval: false
# launch the chat app inside the IDE
chattr_app()
```
# Write a function called gogo that prints out 10
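As a quick smoke test, the prompt above should produce something like this base R function:

```r
# define gogo, which prints the number 10
gogo <- function() {
  print(10)
}

gogo()
#> [1] 10
```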
## Using ChatGPT
## Using ChatGPT inside R
https://www.youtube.com/watch?v=nvTJyQFsgCc
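chattr can also talk to OpenAI models from the R console. A minimal sketch, assuming you have an OpenAI API key available in the `OPENAI_API_KEY` environment variable (the `"gpt4"` backend name and the `chattr()` prompt function follow the chattr documentation):

```r
library(chattr)

# chattr reads the API key from the OPENAI_API_KEY environment
# variable; the value below is a placeholder
Sys.setenv(OPENAI_API_KEY = "your-key-here")

# select an OpenAI backend
chattr_use("gpt4")

# submit a prompt directly from the console
chattr("Write a function called gogo that prints out 10")
```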
# Reference
https://reg.conf.posit.co/flow/posit/positconf24/publiccatalog/page/publiccatalog?search=&tab.day=20240813